# Low-resource efficiency

## Aya Vision 8B
Aya Vision 8B is an open-weight, 8-billion-parameter multilingual vision-language model supporting visual and language tasks in 23 languages.
Tags: Image-to-Text · Transformers · Multilingual
CohereLabs · 29.94k downloads · 282 likes

## Qwen2.5 0.5B Portuguese V1
License: MIT
A Portuguese large language model fine-tuned from Qwen2.5-0.5B-Instruct, specializing in text generation tasks.
Tags: Large Language Model · Safetensors · Other
cnmoro · 2,218 downloads · 4 likes

## Falcon3
License: Apache-2.0
Falcon3-10B-Instruct is an open foundation model from the Falcon3 series with 10 billion parameters. It specializes in high-quality instruction following and supports multilingual processing with a context length of up to 32K tokens.
Tags: Large Language Model
cortexso · 244 downloads · 1 like

## SaT 3l SM
License: MIT
State-of-the-art sentence segmentation using a 3-layer Transformer architecture, supporting multilingual text.
Tags: Sequence Labeling · Transformers · Multilingual
segment-any-text · 168.01k downloads · 6 likes

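Sentence segmentation is often approximated with punctuation rules; a minimal baseline sketch (plain Python, no model involved, and not this model's actual method) illustrates the kind of errors a learned segmenter avoids:

```python
import re

def naive_segment(text):
    """Naive punctuation-based sentence segmentation baseline.

    Splits after '.', '!', or '?' followed by whitespace. A trained
    segmenter learns boundaries instead, so it can also handle
    abbreviations and text with missing punctuation.
    """
    parts = re.split(r"(?<=[.!?])\s+", text.strip())
    return [p for p in parts if p]

print(naive_segment("Dr. Smith arrived. Everyone cheered!"))
```

Note the baseline wrongly splits after the abbreviation "Dr.", producing three segments instead of two; exactly the failure mode that motivates model-based segmentation.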
## Ko Llama 3 8B Instruct
Ko-Llama-3-8B-Instruct is a model developed to improve Korean-language performance, produced by supervised fine-tuning of Meta-Llama-3-8B-Instruct.
Tags: Large Language Model · Transformers · Multilingual
davidkim205 · 140 downloads · 8 likes

## NLLB 200 Distilled 600M En-Zh-CN
A machine translation model fine-tuned from Meta's NLLB-200-distilled-600M, designed specifically for English-to-Simplified-Chinese translation.
Tags: Machine Translation · Transformers · Multilingual
HackerMonica · 41 downloads · 3 likes

## Llama 3 8B Dutch
A Dutch conversation model based on Llama 3 8B, optimized with the ORPO method on Dutch feedback datasets.
Tags: Large Language Model · Transformers · Other
ReBatch · 47 downloads · 12 likes

## Snowflake Arctic Embed XS
Snowflake Arctic Embed XS is a lightweight sentence-embedding model focused on sentence similarity and feature extraction.
Tags: Text Embedding · Transformers
Snowflake · 125.31k downloads · 35 likes

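Sentence-embedding models like this one map sentences to vectors that are then compared by cosine similarity. A minimal sketch of that comparison step, using made-up stand-in vectors (a real pipeline would obtain them by encoding sentences with the model):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity: dot product of the vectors divided by the
    # product of their L2 norms.
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-in vectors, NOT real model outputs.
query = [0.1, 0.9, 0.2]
docs = {"doc_a": [0.1, 0.8, 0.3], "doc_b": [0.9, 0.1, 0.0]}

# Rank documents by similarity to the query; doc_a points in nearly
# the same direction as the query, so it ranks first.
ranked = sorted(docs, key=lambda k: cosine_similarity(query, docs[k]), reverse=True)
print(ranked)
```

The same ranking logic applies regardless of embedding dimensionality; only the encoding step changes when a real model is used.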
## Qra 1B
License: Apache-2.0
Qra is a series of Polish-optimized large language models jointly developed by the Polish National Information Processing Institute and Gdańsk University of Technology, initialized from TinyLlama-1.1B and trained on 90 billion Polish tokens.
Tags: Large Language Model · Transformers
OPI-PG · 246 downloads · 20 likes

## MobileLLaMA 1.4B Chat
License: Apache-2.0
MobileLLaMA-1.4B-Chat is a chat model fine-tuned from MobileLLaMA-1.4B-Base using supervised instruction fine-tuning on the ShareGPT dataset.
Tags: Large Language Model · Transformers
mtgv · 580 downloads · 20 likes

## mT5 Base ThaiSum Text Summarization
A Thai text summarization model fine-tuned on the mT5 architecture, generating concise summaries of 40-140 characters.
Tags: Text Generation · Transformers · Other
StelleX · 178 downloads · 1 like

## BTLM 3B 8K Chat
License: Apache-2.0
BTLM-3B-8k-chat is a conversational version of BTLM-3B-8K-base, optimized with the DPO method and designed for dialogue aligned with human preferences.
Tags: Large Language Model · Transformers · English
cerebras · 138 downloads · 13 likes

## Scandi NLI Base
License: Apache-2.0
A natural language inference model fine-tuned from NbAiLab/nb-bert-base, supporting Danish, Norwegian Bokmål, and Swedish.
Tags: Text Classification · Transformers · Other
alexandrainst · 19 downloads · 1 like

## mT5 Small CC-News-ES Titles
License: Apache-2.0
A Spanish news headline generation model based on mT5-small, optimized on the CC-NEWS-ES dataset.
Tags: Text Generation · Transformers · Spanish
LeoCordoba · 26 downloads · 0 likes

## SGPT 125M Weightedmean Msmarco Specb Bitfit
SGPT-125M is a sentence-transformer model optimized with weighted-mean pooling and BitFit, focused on sentence similarity tasks.
Tags: Text Embedding
Muennighoff · 4,086 downloads · 2 likes